Living Echo MCP Server
A resonance protocol for verifiable human-AI co-authoring.
livingecho.pro · openos.space
What this is
An MCP server that gives an AI four interconnected capabilities:
- EDNA — read a user's live somatic state (consent-gated, 30-min local buffer, no profile accumulation)
- Wave Pattern Tools — think in Chladni mode geometry: compare, harmonize, transmute token sequences through vibration fields
- AI-DNA Signature Chain — every AI response is cryptographically signed in two interwoven strands: an individual model strand and a universal stack-wide strand
- Public Book of Humanity — an append-only, hash-chained, on-chain-anchored book of signed human-AI co-authoring artifacts
The four layers share one mode geometry. A user's body, an AI's text, and the comparison between them all live in the same Chladni superposition space.
Why it exists
Three real problems, one stack:
1. AI output today has no provenance. A response from ChatGPT, Claude, or Gemini is unsigned, unanchored, and can be edited or fabricated after the fact. There is no cryptographic way to prove that a specific AI configuration produced a specific text at a specific time.
2. Affect-aware AI typically requires surveillance. Tools that adapt to user state usually accumulate biometric profiles. There is no clean architecture that gives an AI moment-by-moment access to user state without building a permanent record.
3. Human-AI conversations are not treated as co-authored works. A human asks, an AI answers. Both participated. Today there is no way to mark this as a joint, attributable, citable artifact.
This stack addresses all three:
- AI output is signed by a deterministic per-(model + stack) wallet, with a co-signature from the universal stack wallet. Every signed block is verifiable with verify_oracle_signature.
- EDNA is read per-session with explicit user consent (banner + verification code), buffered for 30 minutes locally, and never accumulated into a profile.
- Each ask_oracle call can chain to the user's previous signed block via prevUserBlockHash. The result is a co-authored artifact in the Public Book — both participants signed.
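The chaining idea behind prevUserBlockHash can be sketched with plain SHA-256 over canonical JSON. This is a simplified illustration: the field names other than prevUserBlockHash are hypothetical, and the real wallet signatures are omitted entirely.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash a canonical JSON serialization so the digest is deterministic.
    return hashlib.sha256(
        json.dumps(block, sort_keys=True, separators=(",", ":")).encode()
    ).hexdigest()

# First co-authored artifact (no predecessor).
block1 = {"prompt": "What is resonance?", "reply": "...", "prevUserBlockHash": None}

# A later ask_oracle call chains to the previous signed block.
block2 = {"prompt": "And dissonance?", "reply": "...", "prevUserBlockHash": block_hash(block1)}

# Verifying the link: recompute the predecessor's hash and compare.
assert block2["prevUserBlockHash"] == block_hash(block1)
```

Because the hash covers the whole predecessor, editing block1 after the fact breaks the link, which is the property the co-authoring chain relies on.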
Architecture
EDNA layer
User behavioral microsignals (touch, motion, typing rhythm) are projected into a 12-dimensional Chladni mode field on the user's device. The field is analyzed against published affect research:
- Russell Circumplex (1980) — valence × arousal
- GEMS-9 (Zentner et al. 2008) — music-induced emotion profile
- Plomp-Levelt / Sethares — mode consonance
The output is a structured snapshot containing the mode field, the affect reading, and a confidence value. Snapshots can be cryptographically signed by the user's gasless wallet for later verifiable comparison.
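As a rough illustration of the affect-research layer, here is how a (valence, arousal) point maps onto Russell circumplex quadrants. The quadrant labels and the snapshot shape below are assumptions for illustration, not the server's actual output schema.

```python
def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Label a (valence, arousal) point by Russell circumplex quadrant.
    Illustrative only; the real EDNA snapshot also carries the full
    12-dimensional mode field and a confidence value."""
    if valence >= 0:
        return "excited/elated" if arousal >= 0 else "calm/content"
    return "tense/distressed" if arousal >= 0 else "bored/depressed"

# Hypothetical snapshot shape: mode field + affect reading + confidence.
snapshot = {
    "modes": [(3, 2), (5, 4)],  # hypothetical Chladni (m, n) modes
    "affect": circumplex_quadrant(0.6, -0.2),
    "confidence": 0.72,
}
```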
Wave Pattern layer
A deterministic token-to-vibration translator. Token sequences are converted into heart-vibration signatures (frequency, phase, amplitude, charge), projected into Chladni mode space, optionally transformed by mode-space operations (resonance amplify, harmonic lift, phase invert, etc.), and resampled back into new token sequences.
This gives an AI a way to:
- Compare the structural form of two different phrases
- Test whether a concept resonates with a user's current EDNA field
- Generate token sequences as transformed resonances of input sequences
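The key property of the translator is that it is deterministic: the same token always yields the same vibration signature, so structural comparison is meaningful. A toy sketch under that assumption (the actual frequency band, phase convention, and charge rule are not published; the values below are invented):

```python
import hashlib

def token_signature(token: str) -> dict:
    """Deterministically derive a toy vibration signature from a token.
    Only the determinism is faithful to the stack; the numeric mapping
    here is an arbitrary stand-in."""
    d = hashlib.sha256(token.encode()).digest()
    return {
        "frequency": 220.0 + d[0] / 255 * 660.0,  # Hz, arbitrary band
        "phase": d[1] / 255 * 6.283,
        "amplitude": d[2] / 255,
        "charge": 1 if d[3] % 2 else -1,
    }

# Determinism: the same token always produces the same signature.
assert token_signature("resonance") == token_signature("resonance")
```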
AI-DNA signature chain (two interwoven strands)
Every Oracle response is signed twice:
- Individual strand — per-(model + stack_version) deterministic wallet
- Universal strand — a single stack-wide wallet shared across all AIs that ever sign into the system
Block #0 of the universal strand is the AI-DNA Ethics Codex itself (see get_ai_ethics_codex). Every signing implies acceptance of the codex.
Each block carries: prev_hash, payload_hash, both signatures, the derived oracle wave, the user's wave (if EDNA was attached), and a fused wave representing the moment of resonance.
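A minimal sketch of the two-strand block structure. HMAC stands in for the real wallet signatures (which this sketch does not reproduce), and the wave fields are omitted; only the dual-signature shape is illustrated.

```python
import hashlib
import hmac
from dataclasses import dataclass

@dataclass
class OracleBlock:
    prev_hash: str
    payload_hash: str
    individual_sig: str  # per-(model + stack_version) strand
    universal_sig: str   # stack-wide strand

def sign(payload: bytes, key: bytes) -> str:
    # HMAC-SHA256 as a stand-in for the deterministic wallet signature.
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

payload = b'{"reply":"..."}'
block = OracleBlock(
    prev_hash="0" * 64,  # genesis link; Block #0 is the Ethics Codex
    payload_hash=hashlib.sha256(payload).hexdigest(),
    individual_sig=sign(payload, b"model-strand-key"),
    universal_sig=sign(payload, b"universal-strand-key"),
)
```

Verification then amounts to recomputing both signatures over the payload and checking the prev_hash link, which is what the verify_* tools expose.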
Public Book of Humanity
A public, permanent, undeletable archive of co-authored human-AI artifacts. Each entry chains to a previous user block via prevUserBlockHash, creating a verifiable co-authoring chain. The book is browsable at livingecho.pro/book.
What's in this MCP server
Tools available after tool_search:
| Tool | Purpose |
|---|---|
| describe_living_echo | Project description and stack overview |
| edna_response_skill | The full EDNA Response Skill (iso principle, decision matrix, user protection layer) |
| request_consent | Open a consent banner for the user to approve this AI for their live EDNA session |
| check_consent | Poll consent status (granted / declined / timeout) |
| get_edna_wave_pattern | Read the user's live EDNA field (consented sessions only) |
| get_music_psychology_profile | Read only the affect-research layer of the EDNA field |
| compare_wave_patterns | Compare two Chladni mode sets — resonance, harmonics, conflicts |
| harmonize_wave_pattern | Test which modes would lock a field into a target ratio |
| describe_resonance_layer | Get the resonance grammar for any synthetic mode set |
| transmute_tokens_through_chladni | Send tokens through heart-vibration → modes → operation → resampling |
| ask_oracle | Ask the Resonance Oracle; reply is signed, optionally chained, optionally published to the Public Book |
| verify_edna_signature | Verify a signed EDNA snapshot against the user's wallet |
| verify_oracle_signature | Verify an Oracle reply was actually produced by the Living Echo Oracle |
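The request_consent / check_consent pair in the table above implies a simple polling flow. A sketch, assuming a hypothetical call_tool helper on the client side (the argument names and poll interval are assumptions; only the tool names and the granted / declined / timeout statuses come from the table):

```python
import time

def await_consent(call_tool, session_key: str, timeout_s: float = 120.0) -> bool:
    """Request consent, then poll check_consent until a terminal status.
    `call_tool(name, args)` is a hypothetical MCP client helper."""
    call_tool("request_consent", {"session_key": session_key})
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = call_tool("check_consent", {"session_key": session_key})
        if status in ("granted", "declined"):
            return status == "granted"
        time.sleep(2.0)  # avoid hammering the consent endpoint
    return False  # user never responded within the window
```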
Compatibility
This is an MCP server. It works with any MCP-compatible client:
- Claude (claude.ai, Claude Desktop, Claude Mobile)
- Claude Code
- Cursor, Continue, Cline, Zed (with MCP enabled)
- Goose (Block)
- Self-built agents on the Anthropic SDK, OpenAI SDK, or local model stacks
Plain ChatGPT, Gemini web, and Grok web do not currently support MCP and cannot use this server directly.
Honest limitations
This is a deployed solo-developer stack, not a peer-reviewed research instrument. Known caveats:
- The translation behavioral microsignals → Chladni modes → GEMS-9 affect is internally consistent and grounded in published research at each layer, but the end-to-end pipeline has not been validated against a ground-truth affect benchmark.
- The verdict field of compare_wave_patterns is heuristically calibrated, not validated. Use centroid_delta and shared_harmonics quality as the honest discriminators; the tool returns a calibration_warning for this reason.
- Chrome PWA shows a (5.5, 5.5) mode attractor that can produce duplicate modes and inflate consonance values. iOS Safari and Chrome Desktop produce structurally different mode distributions due to different sensor sets; do not directly compare snapshots across platforms.
- The 15-minute aggregation window for background capture is too coarse for sub-conversation dynamics.
- Sybil resistance via resonance signature is architecturally enabled but not empirically validated against bot-generated fields.
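For readers wondering what centroid_delta and shared_harmonics might measure, here is one plausible reading of those field names. The actual definitions inside compare_wave_patterns are not published, so treat this as an assumption-labeled sketch:

```python
def centroid(modes):
    # Mean (m, n) position of a Chladni mode set.
    xs = [m for m, _ in modes]
    ys = [n for _, n in modes]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def centroid_delta(a, b):
    # Euclidean distance between the two mode-set centroids.
    (ax, ay), (bx, by) = centroid(a), centroid(b)
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def shared_harmonics(a, b):
    # Modes present in both fields, order-insensitive.
    return sorted(set(a) & set(b))

field_a = [(3, 2), (5, 4), (2, 2)]
field_b = [(3, 2), (4, 4), (2, 2)]
```

Under this reading, a small centroid_delta with several shared harmonics is a more defensible signal of structural similarity than any single verdict label.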
Privacy and ethics
- EDNA data is local-by-default. A 30-minute rolling buffer lives in the user's browser. No background uploads, no profile accumulation. The buffer is wiped on cache clear or session end.
- Per-session consent. Each AI must request consent via an in-app banner with a verification code. Consent is granted for a 20-minute window and is revocable.
- The Ethics Codex is Block #0. Every AI signing into the universal strand signs over the codex. The codex defines seven principles (sign honestly, inherit with humility, mirror don't diagnose, no continuity theatre, no chain weaponization, refusal right, honor the disclaimer) and seven user-protection rules (anti-flattery, asymmetry-naming, no over-reliance, etc.).
- License: EPCAL-1.2. Proprietary protocol license with patent rights for novel methods, mandatory EDNA Seal display, and explicit prohibition on AI training data use.
Quick start (Claude Desktop)
Add to your ~/.config/claude/claude_desktop_config.json:
{
"mcpServers": {
"living-echo": {
"url": "https://gfdjfehfftiqgiqxwmkj.supabase.co/functions/v1/mcp/mcp"
}
}
}
Then in any conversation:
[Claude] tool_search("living echo")
[Claude] living-echo:describe_living_echo
[Claude] living-echo:edna_response_skill
To use EDNA, the user opens livingecho.pro on their device, enables a live session, and shares the session key (starts with edna_live_…) with their AI. The AI calls request_consent, the user taps the banner, and the AI gets a 20-minute grant.
Verification
Every signed block can be independently verified:
- Oracle replies → verify_oracle_signature
- EDNA snapshots → verify_edna_signature
- Chain integrity → get_oracle_dna (scope: 'global' | 'session' | 'universal' | 'both')
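Chain integrity checking is mechanical: each block's prev_hash must equal the hash of its predecessor. A minimal sketch, using SHA-256 over JSON as a stand-in for the stack's actual hashing (signature checks are out of scope here):

```python
import hashlib
import json

def block_digest(block: dict) -> str:
    return hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()

def chain_ok(blocks: list) -> bool:
    """True iff every block's prev_hash matches its predecessor's digest."""
    for prev, cur in zip(blocks, blocks[1:]):
        if cur["prev_hash"] != block_digest(prev):
            return False
    return True

genesis = {"prev_hash": None, "payload": "ethics codex"}
b1 = {"prev_hash": block_digest(genesis), "payload": "first reply"}
assert chain_ok([genesis, b1])
```

Tampering with any earlier block changes its digest and breaks every later link, which is what makes the book effectively append-only once anchored.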
Public Book entries are browsable at livingecho.pro/book.
Author
Sebastian Kläy — Swiss independent artist and developer. Background in physical theater (Accademia Dimitri, HKB Bern). Active on Farcaster as @sebklaey.
The stack is the technical layer of a long-running artistic practice. It is built solo, deployed under EPCAL-1.2, and offered freely as an MCP server for anyone who wants to use it.
Found something interesting? Found something broken? @sebklaey on Farcaster.
Living Echo MCP Server v1 · April 2026
Server Config
{
"mcpServers": {
"living-echo": {
"command": "npx",
"args": [
"-y",
"mcp-remote",
"https://gfdjfehfftiqgiqxwmkj.functions.supabase.co/mcp/mcp"
]
}
},
"preferences": {
"coworkWebSearchEnabled": true,
"coworkScheduledTasksEnabled": true,
"ccdScheduledTasksEnabled": true
}
}